AIbase

# Wikipedia corpus training

## Gn Bert Small Cased
License: MIT
A BERT model pretrained for Guarani (6 layers, case-sensitive), trained on Wikipedia and Wiktionary (approx. 800k tokens).
Categories: Large Language Model, Transformers, Other
Author: mmaguero · Downloads: 26 · Likes: 0
## Mt5 Zh Ja En Trimmed
A multilingual translation model fine-tuned from mt5-base, supporting translation among Chinese, Japanese, and English.
Categories: Machine Translation, Transformers, Multilingual
Author: K024 · Downloads: 146 · Likes: 57